1.
Psychol Res ; 2024 Jan 12.
Article in English | MEDLINE | ID: mdl-38214774

ABSTRACT

A vast body of research suggests that the primary motor cortex is involved in motor imagery. This raises the issue of inhibition: how is it possible for motor imagery not to lead to motor execution? Bach et al. (Psychol Res Psychol Forschung. 10.1007/s00426-022-01773-w, 2022, this issue) suggest that the motor execution threshold may be "upregulated" during motor imagery to prevent execution. Alternatively, it has been proposed that, in parallel to excitatory mechanisms, inhibitory mechanisms may be actively suppressing motor output during motor imagery. These theories are verbal in nature, with well-known limitations. Here, we describe a toy model of the inhibitory mechanisms thought to be at play during motor imagery to start disentangling predictions from competing hypotheses.
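The two competing accounts in the abstract above can be contrasted with a minimal sketch. The function below is a hypothetical illustration (not the model described in the paper): a single net-drive threshold rule under which either threshold upregulation or active inhibition prevents execution.

```python
# Toy contrast of two inhibition hypotheses for motor imagery
# (hypothetical illustration; parameter values are arbitrary).

def motor_output(excitation, inhibition=0.0, threshold=1.0):
    """Return True if net motor drive crosses the execution threshold."""
    return (excitation - inhibition) >= threshold

# Overt execution: strong excitatory drive, default threshold -> movement.
assert motor_output(excitation=1.2) is True

# Hypothesis 1: the execution threshold is upregulated during imagery.
assert motor_output(excitation=1.2, threshold=1.5) is False

# Hypothesis 2: parallel active inhibition suppresses the same drive.
assert motor_output(excitation=1.2, inhibition=0.5) is False
```

Both mechanisms block output here; distinguishing them empirically requires predictions about intermediate quantities (e.g., how residual drive scales with excitation), which is what a formal model makes explicit.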

2.
Cortex ; 169: 161-173, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37922641

ABSTRACT

Humans have the ability to mentally examine speech. This covert form of speech production is often accompanied by sensory (e.g., auditory) percepts. However, the cognitive and neural mechanisms that generate these percepts are still debated. According to a prominent proposal, inner speech has at least two distinct phenomenological components: inner speaking and inner hearing. We used transcranial magnetic stimulation to test whether these two phenomenologically distinct processes are supported by distinct neural mechanisms. We hypothesised that inner speaking relies more strongly on an online motor-to-sensory simulation that constructs a multisensory experience, whereas inner hearing relies more strongly on a memory-retrieval process, where the multisensory experience is reconstructed from stored motor-to-sensory associations. Accordingly, we predicted that the speech motor system would be involved more strongly during inner speaking than inner hearing. This would be revealed by modulations of TMS-evoked responses at the muscle level following stimulation of the lip primary motor cortex. Overall, data collected from 31 participants corroborated this prediction, showing that inner speaking increases the excitability of the primary motor cortex more than inner hearing. Moreover, this effect was more pronounced during the inner production of a syllable that strongly recruits the lips (vs. a syllable that recruits the lips to a lesser extent). These results are compatible with models assuming that the primary motor cortex is involved during inner speech and help clarify the neural implementation of the fundamental ability of silently speaking in one's mind.


Subject(s)
Speech Perception , Humans , Speech Perception/physiology , Hearing , Speech/physiology , Transcranial Magnetic Stimulation/methods , Evoked Potentials, Motor/physiology
3.
Front Hum Neurosci ; 16: 804832, 2022.
Article in English | MEDLINE | ID: mdl-35355587

ABSTRACT

Covert speech is accompanied by a subjective multisensory experience with auditory and kinaesthetic components. An influential hypothesis states that these sensory percepts result from a simulation of the corresponding motor action that relies on the same internal models recruited for the control of overt speech. This simulationist view raises the question of how it is possible to imagine speech without executing it. In this perspective, we discuss the possible role(s) played by motor inhibition during covert speech production. We suggest that considering covert speech as an inhibited form of overt speech maps naturally onto the purported progressive internalization of overt speech during childhood. We further argue that the role of motor inhibition may differ widely across different forms of covert speech (e.g., condensed vs. expanded covert speech) and that considering this variety helps reconcile seemingly contradictory findings from the neuroimaging literature.

4.
Brain Cogn ; 155: 105811, 2021 12.
Article in English | MEDLINE | ID: mdl-34737127

ABSTRACT

Coarse information of a visual stimulus is conveyed by Low Spatial Frequencies (LSF) and is thought to be rapidly extracted to generate predictions. This may guide fast recognition with the subsequent integration of fine information, conveyed by High Spatial Frequencies (HSF). In autism, emotional face recognition is challenging, and might be related to alterations in LSF predictive processes. We analyzed the data of 27 autistic and 34 non-autistic (NA) adults on an emotional Stroop task (i.e., emotional face with congruent or incongruent emotional word) with spatially filtered primes (HSF vs. LSF). We hypothesized that LSF primes would generate predictions leading to faster categorization of the target face compared to HSF primes, in the NA group but not in autism. Surprisingly, HSF primes led to faster categorization than LSF primes in both groups. Moreover, the advantage of HSF vs. LSF primes was stronger for angry than happy faces in NA participants, but stronger for happy than angry faces in autistic participants. Drift diffusion modelling confirmed the HSF advantage and showed a longer non-decision time (e.g., encoding) in autism. Although LSF predictive impairments in autism were not corroborated, our analyses suggest low-level processing specificities in autism.


Subject(s)
Autistic Disorder , Facial Recognition , Adult , Emotions , Happiness , Humans , Pattern Recognition, Visual , Photic Stimulation , Recognition, Psychology
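The drift diffusion decomposition used in the study above separates decision time from non-decision time (encoding and motor preparation). A minimal simulation sketch (hypothetical parameters, not the fitted values from the paper) shows how a longer non-decision time shifts the whole reaction-time distribution upward without changing the evidence-accumulation process itself:

```python
import random

def ddm_trial(drift, boundary=1.0, non_decision=0.3, dt=0.001,
              noise=1.0, rng=None):
    """Simulate one drift-diffusion trial; return (choice, reaction_time).

    choice is 1 if the upper boundary is hit first, 0 for the lower one;
    reaction_time = decision time + non-decision time (encoding + motor).
    """
    rng = rng or random.Random()
    x, t = 0.0, 0.0
    while abs(x) < boundary:
        # Euler step: deterministic drift plus Gaussian diffusion noise.
        x += drift * dt + noise * (dt ** 0.5) * rng.gauss(0, 1)
        t += dt
    return (1 if x >= boundary else 0), t + non_decision

rng = random.Random(42)
# Same drift (same accuracy), but a longer non-decision time -> slower RTs.
fast = [ddm_trial(2.0, non_decision=0.30, rng=rng)[1] for _ in range(200)]
slow = [ddm_trial(2.0, non_decision=0.45, rng=rng)[1] for _ in range(200)]
assert sum(slow) / len(slow) > sum(fast) / len(fast)
```

This is why a non-decision-time difference is interpreted as a pre-decisional (e.g., encoding) specificity rather than a difference in evidence accumulation.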
5.
Neurosci Biobehav Rev ; 126: 329-337, 2021 07.
Article in English | MEDLINE | ID: mdl-33757817

ABSTRACT

Metacognitive deficits are well documented in schizophrenia spectrum disorders as a decreased capacity to adjust confidence to performance in a cognitive task. Because metacognitive ability directly depends on task performance, metacognitive deficits might be driven by lower task performance among patients. To test this hypothesis, we conducted a Bayesian meta-analysis of 42 studies comparing the metacognitive abilities of 1425 individuals with schizophrenia with those of 1256 matched controls. We found a global metacognitive deficit in schizophrenia (g = -0.57, 95 % CrI [-0.72, -0.43]), which was driven by studies that did not control for task performance (g = -0.63, 95 % CrI [-0.78, -0.49]), and inconclusive among controlled studies (g = -0.23, 95 % CrI [-0.60, 0.16], BF01 = 2.2). No correlation was found between the metacognitive deficit and clinical features. We provide evidence that the metacognitive deficit in schizophrenia is inflated due to non-equated task performance. Thus, efforts should be made to develop experimental protocols accounting for lower task performance in schizophrenia.


Subject(s)
Metacognition , Schizophrenia , Bayes Theorem , Humans
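The effect sizes pooled in meta-analyses like the one above are typically standardised mean differences with a small-sample bias correction (Hedges' g). A sketch with made-up summary statistics (not data from the meta-analysis):

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d between two groups, with Hedges' small-sample correction."""
    sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    d = (m1 - m2) / sd_pooled
    j = 1 - 3 / (4 * (n1 + n2) - 9)   # small-sample correction factor
    return j * d

# Hypothetical study: patients (group 1) show lower metacognitive scores
# than matched controls (group 2), yielding a negative g.
g = hedges_g(m1=0.60, sd1=0.15, n1=30, m2=0.70, sd2=0.15, n2=30)
assert round(g, 2) == -0.66
```

A negative g, as in the pooled estimates above, means the patient group scores below controls in pooled-SD units.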
6.
Int J Psychophysiol ; 159: 23-36, 2021 01.
Article in English | MEDLINE | ID: mdl-33159987

ABSTRACT

Previous research showed that mental rumination, considered as a form of repetitive and negative inner speech, is associated with increased facial muscular activity. However, the relation between these muscular activations and the underlying mental processes is still unclear. In this study, we tried to separate the facial electromyographic correlates of induced rumination related to either i) mechanisms of (inner) speech production or ii) rumination as a state of pondering on negative affects. To this end, we compared two groups of participants submitted to two types of rumination induction (for a total of 85 female undergraduate students without excessive depressive symptoms). The first type of induction was designed to specifically induce rumination in a verbal modality, whereas the second one was designed to induce rumination in a visual modality. Following the motor simulation view of inner speech production, we hypothesised that the verbal rumination induction should result in a greater increase of activity in the speech-related muscles than the non-verbal rumination induction. We also hypothesised that relaxation focused on the orofacial area should be more efficient in reducing rumination (when experienced in a verbal modality) than relaxation focused on a non-orofacial area. Our results do not corroborate these hypotheses, as both rumination inductions resulted in a similar increase of peripheral muscular activity in comparison to baseline levels. Moreover, the two relaxation types were similarly efficient in reducing rumination, regardless of the induction type. We discuss these results in relation to the inner speech literature and suggest that, because rumination is a habitual and automatic form of emotion regulation, it might be a particularly strongly internalised and condensed form of inner speech. Pre-registered protocol, preprint, data, as well as reproducible code and figures are available at: https://osf.io/c9pag/.


Subject(s)
Cognition , Speech , Face , Female , Humans , Students
7.
PLoS One ; 15(5): e0233282, 2020.
Article in English | MEDLINE | ID: mdl-32459800

ABSTRACT

Although having a long history of scrutiny in experimental psychology, it is still controversial whether wilful inner speech (covert speech) production is accompanied by specific activity in speech muscles. We present the results of a preregistered experiment looking at the electromyographic correlates of both overt speech and inner speech production of two phonetic classes of nonwords. An automatic classification approach was undertaken to discriminate between two articulatory features contained in nonwords uttered in both overt and covert speech. Although this approach led to reasonable accuracy rates during overt speech production, it failed to discriminate inner speech phonetic content based on surface electromyography signals. However, exploratory analyses conducted at the individual level revealed that, in two participants, it seemed possible to distinguish between covertly produced rounded and spread nonwords. We discuss these results in relation to the existing literature and suggest alternative ways of testing the engagement of the speech motor system during wilful inner speech production.


Subject(s)
Electromyography , Muscle, Skeletal/physiology , Phonetics , Thinking/physiology , Brain/physiology , Female , Humans , Pattern Recognition, Automated , Speech/physiology , Young Adult
8.
Psychol Sci ; 31(5): 488-504, 2020 05.
Article in English | MEDLINE | ID: mdl-32271656

ABSTRACT

Previous studies have suggested that action constraints influence visual perception of distances. For instance, the greater the effort to cover a distance, the longer people perceive this distance to be. The present multilevel Bayesian meta-analysis (37 studies with 1,035 total participants) supported the existence of a small action-constraint effect on distance estimation, Hedges's g = 0.29, 95% credible interval = [0.16, 0.47]. This effect varied slightly according to the action-constraint category (effort, weight, tool use) but not according to participants' motor intention. Some authors have argued that such effects reflect experimental demand biases rather than genuine perceptual effects. Our meta-analysis did not allow us to dismiss this possibility, but it also did not support it. We provide field-specific conventions for interpreting action-constraint effect sizes and the minimum sample sizes required to detect them with various levels of power. We encourage researchers to help us update this meta-analysis by directly uploading their published or unpublished data to our online repository ( https://osf.io/bc3wn/ ).


Subject(s)
Distance Perception , Psychomotor Performance , Visual Perception , Bayes Theorem , Humans , Photic Stimulation
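The meta-analysis above used a multilevel Bayesian model; the classical counterpart of pooling heterogeneous study effects is inverse-variance random-effects pooling (DerSimonian-Laird). The sketch below is that frequentist analogue with made-up numbers, not the authors' model:

```python
import math

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling of study effect sizes."""
    w = [1 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Q statistic measures heterogeneity beyond sampling error.
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se

# Three hypothetical action-constraint studies (g, sampling variance).
pooled, se = random_effects_pool([0.1, 0.5, 0.3], [0.02, 0.02, 0.02])
assert 0.1 < pooled < 0.5
```

The between-study variance tau² is what distinguishes a random-effects estimate from a fixed-effect one: heterogeneous studies widen the pooled interval rather than being averaged away.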
9.
J Speech Lang Hear Res ; 62(5): 1225-1242, 2019 05 21.
Article in English | MEDLINE | ID: mdl-31082309

ABSTRACT

Purpose: Bayesian multilevel models are increasingly used to overcome the limitations of frequentist approaches in the analysis of complex structured data. This tutorial introduces Bayesian multilevel modeling for the specific analysis of speech data, using the brms package developed in R. Method: In this tutorial, we provide a practical introduction to Bayesian multilevel modeling by reanalyzing a phonetic data set containing formant (F1 and F2) values for 5 vowels of standard Indonesian (ISO 639-3:ind), as spoken by 8 speakers (4 females and 4 males), with several repetitions of each vowel. Results: We first give an introductory overview of the Bayesian framework and multilevel modeling. We then show how Bayesian multilevel models can be fitted using the probabilistic programming language Stan and the R package brms, which provides an intuitive formula syntax. Conclusions: Through this tutorial, we demonstrate some of the advantages of the Bayesian framework for statistical modeling and provide a detailed case study, with complete source code for full reproducibility of the analyses (https://osf.io/dpzcb/). Supplemental Material: https://doi.org/10.23641/asha.7973822.


Subject(s)
Language , Phonation , Speech , Bayes Theorem , Female , Humans , Indonesia , Male , Multilevel Analysis , Phonetics , Sex Characteristics
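The core behaviour of the multilevel ("varying intercepts") models the tutorial fits with brms is partial pooling: each speaker's estimate is shrunk toward the grand mean in proportion to how noisy that speaker's data are. A pure-Python sketch with hypothetical formant numbers (not the Indonesian data set, and not brms itself):

```python
def partial_pool(group_means, group_ns, grand_mean, within_var, between_var):
    """Shrink each group (speaker) mean toward the grand mean.

    The weight on a speaker's own mean grows with its sample size n and
    with the between-speaker variance; small, noisy groups are pulled
    more strongly toward the grand mean (partial pooling).
    """
    shrunk = []
    for m, n in zip(group_means, group_ns):
        w = between_var / (between_var + within_var / n)
        shrunk.append(w * m + (1 - w) * grand_mean)
    return shrunk

# Two hypothetical speakers' mean F1 (Hz): one with 2 tokens, one with 50.
shrunk = partial_pool([700.0, 300.0], [2, 50],
                      grand_mean=500.0, within_var=400.0, between_var=100.0)
# The sparsely sampled speaker is shrunk much more than the well-sampled one.
assert abs(shrunk[1] - 300.0) < abs(shrunk[0] - 700.0)
```

In brms the same behaviour falls out of a formula such as `F1 ~ vowel + (1 | speaker)`; the sketch only isolates the shrinkage arithmetic that makes multilevel estimates more stable than per-speaker averages.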
10.
Front Psychol ; 9: 699, 2018.
Article in English | MEDLINE | ID: mdl-29867666

ABSTRACT

We argue that making accept/reject decisions on scientific hypotheses, including a recent call for changing the canonical alpha level from p = 0.05 to p = 0.005, is deleterious for the finding of new discoveries and the progress of science. Given that blanket and variable alpha levels both are problematic, it is sensible to dispense with significance testing altogether. There are alternatives that address study design and sample size much more directly than significance testing does; but none of the statistical tools should be taken as the new magic method giving clear-cut mechanical answers. Inference should not be based on single studies at all, but on cumulative evidence from multiple independent studies. When evaluating the strength of the evidence, we should consider, for example, auxiliary assumptions, the strength of the experimental design, and implications for applications. To boil all this down to a binary decision based on a p-value threshold of 0.05, 0.01, 0.005, or anything else, is not acceptable.

11.
Biol Psychol ; 127: 53-63, 2017 07.
Article in English | MEDLINE | ID: mdl-28465047

ABSTRACT

Rumination is predominantly experienced in the form of repetitive verbal thoughts. Verbal rumination is a particular case of inner speech. According to the Motor Simulation view, inner speech is a kind of motor action, recruiting the speech motor system. In this framework, we predicted an increase in speech muscle activity during rumination as compared to rest. We also predicted increased forehead activity, associated with anxiety during rumination. We measured electromyographic activity over the orbicularis oris superior and inferior, frontalis and flexor carpi radialis muscles. Results showed increased lip and forehead activity after rumination induction compared to an initial relaxed state, together with increased self-reported levels of rumination. Moreover, our data suggest that orofacial relaxation is more effective in reducing rumination than non-orofacial relaxation. Altogether, these results support the hypothesis that verbal rumination involves the speech motor system, and provide a promising psychophysiological index to assess the presence of verbal rumination.


Subject(s)
Electromyography , Facial Muscles/physiology , Rumination, Cognitive/physiology , Speech/physiology , Anxiety/physiopathology , Female , Forehead/physiology , Humans , Lip/physiology , Young Adult